
    The Calculation of Chemical and Ionization Equilibrium in a Conventional Shock Tube (Scientific Report No. 12)

    Calculating chemical and ionization equilibrium of plasma in a conventional shock tube

    A History of Miranda and Why It Remains Vital Today


    Structural and lithologic study of northern California Coast Range and Sacramento Valley, California

    The author has identified the following significant results. Photogeologic examination of repetitive multispectral ERTS-1 imagery of Northern California has disclosed several systems of linear features which may be important for the interpretation of the structural history of California. They are separated from an orthogonal system of linears in the Klamath Mts. by a set of discontinuous southeast-trending linear features (the Mendocino system) which is traceable from the Pacific Coast, at Cape Mendocino, into the eastern foothills of the Sierra Nevada. Within the Sierra Nevada, the Mendocino system separates the north-trending Sierran system from a set of linears characteristic of the Modoc Plateau. With minor exceptions, little overlap exists among the systems, which suggests a decipherable chronology and evolutionary history for the region. The San Andreas system of linears appears to truncate or coexist with most of the other systems in the northern Coast Ranges. The Mendocino system truncates the Klamath, Sierran, and Modoc systems. The Sierran system may represent fundamental and long-persisting pre-late-Paleozoic zones of crustal weakness which have been reactivated from time to time. The Mendocino system was possibly developed in the early Mesozoic and is important to the structural framework of Northern California.

    Scalable Population Synthesis with Deep Generative Modeling

    Population synthesis is concerned with the generation of synthetic yet realistic representations of populations. It is a fundamental problem in the modeling of transport, where synthetic populations of micro-agents represent a key input to most agent-based models. In this paper, a new methodological framework for how to 'grow' pools of micro-agents is presented. The model framework adopts a deep generative modeling approach from machine learning based on a Variational Autoencoder (VAE). Compared to previous population synthesis approaches, including Iterative Proportional Fitting (IPF), Gibbs sampling, and traditional generative models such as Bayesian Networks or Hidden Markov Models, the proposed method allows fitting the full joint distribution in high dimensions. The proposed methodology is compared with a conventional Gibbs sampler and a Bayesian Network using a large-scale Danish trip diary. It is shown that, while these two methods outperform the VAE in the low-dimensional case, they both suffer from scalability issues when the number of modeled attributes increases. It is also shown that the Gibbs sampler essentially replicates the agents from the original sample when the required conditional distributions are estimated as frequency tables. In contrast, the VAE allows addressing the problem of sampling zeros by generating agents that are virtually different from those in the original data but have similar statistical properties. The presented approach can support agent-based modeling at all levels by enabling richer synthetic populations with smaller zones and more detailed individual characteristics.
    Comment: 27 pages, 15 figures, 4 tables
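    The abstract's claim about the Gibbs sampler can be illustrated with a small stdlib-only sketch (the attribute names and toy data below are invented for illustration, not taken from the paper's Danish trip diary): when the conditional distributions are estimated as frequency tables from the sample, every Gibbs draw stays inside the set of attribute combinations already observed, so the sampler replicates agents and cannot produce "sampling zeros".

    ```python
    import random
    from collections import Counter, defaultdict

    random.seed(0)

    # Hypothetical toy sample of agents: (age_band, transport_mode) pairs.
    sample = [("young", "bike"), ("young", "car"), ("adult", "car"),
              ("adult", "train"), ("senior", "car")] * 20

    def conditional_table(pairs, given_index):
        """Frequency-table estimate of one attribute conditional on the other."""
        table = defaultdict(Counter)
        for pair in pairs:
            table[pair[given_index]][pair[1 - given_index]] += 1
        return table

    p_age_given_mode = conditional_table(sample, given_index=1)  # P(age | mode)
    p_mode_given_age = conditional_table(sample, given_index=0)  # P(mode | age)

    def draw(counter):
        """Sample a value proportionally to its observed frequency."""
        values, weights = zip(*counter.items())
        return random.choices(values, weights=weights)[0]

    # Gibbs sampling: alternately resample each attribute from its empirical
    # conditional, starting from an observed agent.
    age, mode = sample[0]
    generated = []
    for _ in range(1000):
        age = draw(p_age_given_mode[mode])
        mode = draw(p_mode_given_age[age])
        generated.append((age, mode))

    # Every generated combination already occurs in the original sample:
    # frequency-table conditionals can never assemble an unobserved pair.
    assert set(generated) <= set(sample)
    ```

    A generative model such as the VAE described above sidesteps this by learning a smoothed latent representation, from which decoding can yield attribute combinations absent from the training data.
    
    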

    The Disciplined Use of Simplifying Assumptions

    Submitted to the ACM SIGSOFT Second Software Engineering Symposium: Workshop on Rapid Prototyping, Columbia, Maryland, April 19-21, 1982.
    Simplifying assumptions: everyone uses them, but no one's programming tool explicitly supports them. In programming, as in other kinds of engineering design, simplifying assumptions are an important method for dealing with complexity. Given a complex programming problem, expert programmers typically choose simplifying assumptions which, though false, allow them to arrive rapidly at a program which addresses the important features of the problem without being distracted by all of its details. The simplifying assumptions are then incrementally retracted, with corresponding modifications to the initial program. This methodology is particularly applicable to rapid prototyping because the main questions of interest can often be answered using only the initial program. Simplifying assumptions can easily be misused. In order to use them effectively, two key issues must be addressed. First, simplifying assumptions should be chosen which simplify the design problem significantly without changing the essential character of the program which needs to be implemented. Second, the designer must keep track of all the assumptions he is making so that he can later retract them in an orderly manner. By explicitly dealing with these issues, a programming assistant system could directly support the use of simplifying assumptions as a disciplined part of the software development process.
    MIT Artificial Intelligence Laboratory

    Introduction: The Challenge of Risk Communication in a Democratic Society

    The symposium editors review key issues concerning the relationship between risk communication and public participation.

    Yeast cytochrome c oxidase: a model system to study mitochondrial forms of the haem-copper oxidase superfamily.

    The known subunits of yeast mitochondrial cytochrome c oxidase are reviewed. The structures of all eleven of its subunits are explored by building homology models based on the published structures of the homologous bovine subunits, and similarities and differences are highlighted, particularly for the core functional subunit I. Yeast genetic techniques enabling the introduction of mutations into the three core mitochondrially-encoded subunits are reviewed.